Is having a [high-end] video card important on a server?
Posted by Patrick on Server Fault
Published on 2010-12-22T15:28:38Z
Tags: terminal-server | video-card
My application is a very interactive one, with lots of colors and drag-and-drop functionality, but no fancy 3D effects, animations or video, so I only use plain GDI (no GDI+, no DirectX).
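To give an idea of what I mean by plain GDI, here is a minimal sketch (illustration only, not my actual code) of a WM_PAINT handler that draws with nothing but classic GDI calls:

```cpp
// Minimal sketch of the kind of plain-GDI drawing described above:
// fill the client area and draw some text, using only GDI (no GDI+, no DirectX).
#include <windows.h>

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    if (msg == WM_PAINT)
    {
        PAINTSTRUCT ps;
        HDC hdc = BeginPaint(hwnd, &ps);

        // Fill the invalidated area with a solid color.
        HBRUSH brush = CreateSolidBrush(RGB(230, 240, 255));
        FillRect(hdc, &ps.rcPaint, brush);
        DeleteObject(brush);

        // Draw some text; a real drag-and-drop UI would draw its items here.
        TextOut(hdc, 10, 10, TEXT("Plain GDI rendering"), 19);

        EndPaint(hwnd, &ps);
        return 0;
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}
```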
In the past my applications ran on desktops or laptops, and I advised my customers to invest in a decent video card with the following (see the sketch after this list for a quick startup check):
- a minimum resolution of 1280x1024
- a minimum color depth of 24 bits
- X Megabytes of memory on the video card
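For desktop installations these requirements can be checked at startup. A minimal sketch of such a check (illustration only) using plain Win32/GDI follows; the video memory figure is left out because it cannot be queried through GDI alone:

```cpp
// Sketch of a startup check for the listed display requirements, using only Win32/GDI.
// In a terminal-server session these calls report the properties of the RDP
// session's virtual display rather than the server's physical video card.
#include <windows.h>
#include <stdio.h>

bool DisplayMeetsRequirements()
{
    HDC screen = GetDC(NULL);                      // DC for the primary display
    int width  = GetDeviceCaps(screen, HORZRES);   // horizontal resolution in pixels
    int height = GetDeviceCaps(screen, VERTRES);   // vertical resolution in pixels
    int depth  = GetDeviceCaps(screen, BITSPIXEL); // color depth in bits per pixel
    ReleaseDC(NULL, screen);

    printf("Display: %dx%d at %d bpp\n", width, height, depth);
    return width >= 1280 && height >= 1024 && depth >= 24;
}
```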
Now my users are switching more and more to terminal servers, hence my question:
What is the importance of a video card on a terminal server?
- Is a video card needed on the terminal server at all?
- If it is, is the resolution of the remote desktop client limited to the resolutions supported by the video card on the server?
- Can the choice of video card in the server influence the performance of the applications running on the terminal server (but displayed on a client desktop PC)?
- If I start using graphical libraries (like Qt) or technologies like DirectX, will that influence the choice of video card on the terminal server?
- Are calculations in that case 'offloaded' to the video card? Even on the terminal server?
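One thing I can do regardless of the answers: detect whether the application is running inside a remote session and scale back expensive drawing there. A small sketch (illustration only) using the documented SM_REMOTESESSION metric:

```cpp
// Sketch: detect whether the process runs inside a remote (RDP / terminal-server)
// session, so costly visual effects can be disabled in that case.
#include <windows.h>
#include <stdio.h>

int main()
{
    // GetSystemMetrics(SM_REMOTESESSION) is nonzero in a Remote Desktop session.
    BOOL remote = GetSystemMetrics(SM_REMOTESESSION) != 0;

    printf(remote ? "Remote session: prefer simple GDI drawing\n"
                  : "Local console: full visual effects are fine\n");
    return 0;
}
```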
Thanks.